Asymptotic Behavior of Bayes' Estimates

Authors

Abstract


Related articles

Variance of Bayes estimates

Abstract: This paper contains an analysis of the performance of Bayes conditional-mean parameter estimators. The main result is t...
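The estimator discussed here is the posterior (conditional) mean. As a minimal, self-contained illustration of such an estimator, the Python sketch below uses a conjugate Gaussian prior and Gaussian observation noise; the prior mean, the variances, and the sample size are assumptions chosen for the example, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative conjugate setup (assumed for this sketch, not taken from the paper):
# theta ~ N(mu0, tau2), and x_i | theta ~ N(theta, sigma2) for i = 1..n.
mu0, tau2 = 0.0, 4.0       # prior mean and prior variance
sigma2, n = 1.0, 25        # noise variance and number of observations

theta = rng.normal(mu0, np.sqrt(tau2))            # draw a "true" parameter
x = rng.normal(theta, np.sqrt(sigma2), size=n)    # noisy observations

# With a Gaussian prior and Gaussian noise the posterior is Gaussian, so the
# Bayes conditional-mean estimate is the posterior mean, available in closed form.
post_var = 1.0 / (1.0 / tau2 + n / sigma2)
post_mean = post_var * (mu0 / tau2 + x.sum() / sigma2)

print(f"true theta          = {theta:.3f}")
print(f"Bayes estimate      = {post_mean:.3f}  (posterior mean)")
print(f"posterior variance  = {post_var:.4f}")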


A Priori Estimates, Geometric Effects and Asymptotic Behavior

Many physical phenomena can be described by a partial differential equation Pu=0. Here P denotes some differential operator or system of such operators, and u, the unknown function, is either a scalar or a vector. The differential equation connects the derivatives of u at each point of its domain D. The mathematician is interested in the global consequences of this local constraint, especially ...
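As one standard instance of an a priori estimate that turns the local constraint Pu = 0 into global information (illustrative only; not necessarily the estimates this paper develops), take $P = \Delta$, so the equation is Laplace's:

\[
\Delta u = 0 \ \text{in } D
\quad\Longrightarrow\quad
\max_{\overline{D}} |u| \;=\; \max_{\partial D} |u|,
\]

the maximum principle: the size of a function harmonic on a bounded domain $D$ (and continuous up to the boundary) is controlled everywhere by its boundary values alone.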


MAD-Bayes: MAP-based Asymptotic Derivations from Bayes

First consider the generative model in Section 2. The joint distribution of the observed data x, cluster indicators z, and cluster means µ can be written as follows: $P(\mathbf{x}, \mathbf{z}, \boldsymbol{\mu}) = P(\mathbf{x} \mid \mathbf{z}, \boldsymbol{\mu})\, P(\mathbf{z})\, P(\boldsymbol{\mu}) = \prod_{k=1}^{K^{+}} \prod_{n :\, z_{n,k} = 1} \dots$
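Read literally, the factorization says the joint log-density splits into a likelihood term plus log-priors over the indicators and the means. Below is a minimal sketch of evaluating such a factorized joint for a toy one-dimensional K-component Gaussian mixture with one-hot indicators; the uniform prior on z and the standard-normal prior on µ are assumptions made for the example, not the paper's model.

import numpy as np
from scipy.stats import norm

def log_joint(x, z, mu, sigma=1.0):
    # log P(x, z, mu) = log P(x | z, mu) + log P(z) + log P(mu) for a toy
    # 1-D mixture; z is an (n, K) one-hot matrix of cluster indicators.
    n, K = z.shape
    centers = mu[z.argmax(axis=1)]                        # mean assigned to each point
    log_lik = norm.logpdf(x, loc=centers, scale=sigma).sum()
    log_pz = n * np.log(1.0 / K)                          # assumed uniform prior on z
    log_pmu = norm.logpdf(mu, loc=0.0, scale=1.0).sum()   # assumed N(0, 1) prior on mu
    return log_lik + log_pz + log_pmu

x = np.array([0.1, -0.2, 5.1, 4.8])
z = np.eye(2)[[0, 0, 1, 1]]        # first two points in cluster 0, the rest in cluster 1
mu = np.array([0.0, 5.0])
print(log_joint(x, z, mu))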


MAD-Bayes: MAP-based Asymptotic Derivations from Bayes

The classical mixture of Gaussians model is related to K-means via small-variance asymptotics: as the covariances of the Gaussians tend to zero, the negative log-likelihood of the mixture of Gaussians model approaches the K-means objective, and the EM algorithm approaches the K-means algorithm. Kulis & Jordan (2012) used this observation to obtain a novel K-means-like algorithm from a Gibbs sam...
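A quick numerical illustration of this limit (an informal check, not the derivation of Kulis & Jordan): for an equal-weight spherical Gaussian mixture with fixed centers, 2σ² times the negative log-likelihood approaches the K-means objective, the sum of squared distances to the nearest center, as σ² tends to zero. The data, centers, and variance schedule below are assumptions chosen for the demonstration.

import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(1)

# Two well-separated toy clusters and fixed candidate centers (all assumed).
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(4.0, 0.3, (50, 2))])
mu = np.array([[0.0, 0.0], [4.0, 4.0]])                       # K = 2 centers

sq_dists = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)    # (n, K) squared distances
kmeans_obj = sq_dists.min(axis=1).sum()                       # K-means objective

n, d = X.shape
K = len(mu)
for sigma2 in (1.0, 0.1, 0.01, 0.001):
    # Log-likelihood of an equal-weight mixture of N(mu_k, sigma2 * I) components.
    log_lik = (logsumexp(-sq_dists / (2.0 * sigma2), axis=1)
               - np.log(K) - 0.5 * d * np.log(2.0 * np.pi * sigma2))
    nll = -log_lik.sum()
    # Small-variance asymptotics: 2 * sigma2 * NLL tends to the K-means objective.
    print(f"sigma2={sigma2:<6}  2*sigma2*NLL={2.0 * sigma2 * nll:9.2f}"
          f"  K-means objective={kmeans_obj:9.2f}")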


Inequalities and Asymptotic Estimates

Unless specified otherwise, we use µ, σ² to denote the mean and variance of the variable under consideration. This note shall be updated throughout the seminar as I find more useful inequalities. 1 Basic inequalities. Theorem 1.1 (Markov's Inequality). If X is a random variable taking only non-negative values, then for any a > 0, Pr[X ≥ a] ≤ E[X]/a. (1) Proof. We show this for the discrete c...
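The proof cut off above is, for a discrete random variable, presumably the standard argument; a LaTeX sketch of that argument:

\begin{align*}
\mathbb{E}[X] \;=\; \sum_{x} x \,\Pr[X = x]
\;\ge\; \sum_{x \ge a} x \,\Pr[X = x]
\;\ge\; \sum_{x \ge a} a \,\Pr[X = x]
\;=\; a \,\Pr[X \ge a],
\end{align*}

where the first inequality uses $X \ge 0$ (the dropped terms are non-negative) and the second uses $x \ge a$ on the remaining terms; dividing by $a > 0$ gives $\Pr[X \ge a] \le \mathbb{E}[X]/a$.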



Journal

Journal title: The Annals of Mathematical Statistics

Year: 1964

ISSN: 0003-4851

DOI: 10.1214/aoms/1177703584